# Parameter-shared architecture
## ALBERT XXLarge v2
License: Apache-2.0
ALBERT XXLarge v2 is a large language model pre-trained with a masked language modeling objective, featuring a parameter-shared Transformer architecture with 12 repeated layers and 223 million parameters.
Tags: Large Language Model · English
Author: albert · Downloads: 19.79k · Likes: 20
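The "parameter-shared" architecture mentioned above means a single block of weights is reused for every Transformer layer, so parameter count stays constant as depth grows. A minimal toy sketch (sizes and counting logic are illustrative assumptions, not ALBERT's actual dimensions):

```python
# Toy illustration of ALBERT-style cross-layer parameter sharing:
# one weight matrix is reused for all 12 layers, so distinct
# parameters do not grow with depth.

def count_params(layers):
    """Count scalar weights across *distinct* weight matrices only."""
    distinct = {id(w): w for w in layers}  # shared matrices counted once
    return sum(len(w) * len(w[0]) for w in distinct.values())

hidden = 4  # toy hidden size

# ALBERT-style: the same matrix object repeated 12 times.
shared_block = [[0.0] * hidden for _ in range(hidden)]
albert_layers = [shared_block] * 12

# BERT-style: 12 independent matrices.
bert_layers = [[[0.0] * hidden for _ in range(hidden)] for _ in range(12)]

print(count_params(albert_layers))  # 16  (one 4x4 matrix)
print(count_params(bert_layers))    # 192 (twelve 4x4 matrices)
```

This is why ALBERT XXLarge stays at 223M parameters despite its large hidden size: depth is added by repetition, not by new weights.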
## ALBERT XLarge v2
License: Apache-2.0
ALBERT XLarge v2 is an English pretrained model based on the Transformer architecture; it employs cross-layer parameter sharing to reduce memory usage and is trained with masked language modeling and sentence order prediction objectives.
Tags: Large Language Model · Transformers · English
Author: albert · Downloads: 2,195 · Likes: 11
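The sentence order prediction (SOP) objective named above builds positive examples from two consecutive text segments and negatives from the same segments swapped. A minimal sketch of how such training pairs could be constructed (the helper name and example sentences are illustrative, not from any ALBERT codebase):

```python
# Toy sketch of sentence order prediction (SOP) pair construction:
# consecutive segments in original order are positives (label 1),
# the same segments swapped are negatives (label 0).

def sop_examples(segments):
    """Yield (seg_a, seg_b, label) pairs from an ordered list of segments."""
    pairs = []
    for a, b in zip(segments, segments[1:]):
        pairs.append((a, b, 1))  # original order -> positive
        pairs.append((b, a, 0))  # swapped order  -> negative
    return pairs

doc = [
    "The model is pretrained on raw text.",
    "It is then fine-tuned on a downstream task.",
]
for a, b, label in sop_examples(doc):
    print(label, "|", a, "->", b)
```

Unlike BERT's next sentence prediction, both segments always come from the same document, so the model must learn discourse coherence rather than topic matching.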